
client/python: add OpenAI-compatible REST API samples#4096

Open
retryoos wants to merge 1 commit into openvinotoolkit:main from retryoos:openai-api-python-samples

Conversation

@retryoos

@retryoos retryoos commented Mar 28, 2026

🛠 Summary

JIRA/Issue if applicable: N/A (small Python client samples addition)

This PR adds OpenAI-compatible REST API Python samples under client/python/openai-api/samples/ and links them from client/python/README.md.

Added:

  • http_list_models.py for GET /v3/models
  • http_chat_completions.py for unary POST /v3/chat/completions
  • http_chat_completions_stream.py for streaming POST /v3/chat/completions (stream: true, SSE parsing)
  • README.md with setup, usage, and examples
  • requirements.txt (stdlib-only note)
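For reference, a minimal sketch of the kind of unary chat/completions request these samples send, using only the standard library as the requirements.txt note implies. The base URL, port, and model name below are illustrative assumptions, not values taken from the PR:

```python
import json
import urllib.request


def chat_completion(base_url, model, prompt):
    """POST a unary OpenAI-style chat/completions request and return the reply text."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    req = urllib.request.Request(
        f"{base_url}/v3/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# The payload shape can be checked without a running server:
demo = {
    "model": "ovms-model",  # hypothetical model name
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
    "stream": False,
}
print(json.dumps(demo))
```

Calling `chat_completion("http://localhost:8000", "ovms-model", "Hello")` against a running OVMS instance would exercise the same path as http_chat_completions.py.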

Updated:

  • client/python/README.md with a link to the new OpenAI API samples section
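The streaming sample listed above relies on SSE parsing of `stream: true` responses. A minimal stdlib-only sketch of extracting delta tokens from OpenAI-style `data:` lines (the canned chunks are illustrative, not output from the PR):

```python
import json


def iter_sse_tokens(lines):
    """Yield content tokens from OpenAI-style SSE 'data:' lines."""
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]


# Offline demonstration with canned chunks:
sample = [
    'data: {"choices": [{"delta": {"content": "Par"}}]}',
    'data: {"choices": [{"delta": {"content": "is"}}]}',
    "data: [DONE]",
]
print("".join(iter_sse_tokens(sample)))  # → Paris
```

Against a live server the same generator would consume the HTTP response line by line instead of a canned list.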

Notes:

  • Local validation was done with python3 -m py_compile and --help checks for all scripts.
  • Runtime error path was validated without a running OVMS instance (clear connection-refused behavior).
  • End-to-end runtime test depends on OVMS startup/model load.
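The connection-refused behavior described in the notes can be reproduced without OVMS. A sketch, assuming nothing is listening on the chosen local port:

```python
import urllib.request


def probe(url, timeout=2.0):
    """Return a short status string instead of a traceback when the server is down."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"HTTP {resp.status}"
    except OSError as err:  # URLError is an OSError subclass
        reason = getattr(err, "reason", err)
        return f"connection failed: {reason}"


# Port 1 is virtually never bound, so this exercises the refused path:
print(probe("http://127.0.0.1:1/v3/models"))
```

This mirrors the "clear connection-refused behavior" the samples were checked for: a readable one-line error rather than an unhandled exception.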

🧪 Checklist

  • Unit tests added.
  • The documentation updated.
  • Change follows security best practices.

@dtrawins dtrawins requested review from atobiszei, mzegla and ngrozae April 1, 2026 22:10
Comment on lines +158 to +161
Sends a streaming request to the OpenVINO Model Server OpenAI-compatible chat/completions endpoint
and prints generated tokens as they arrive.

optional arguments:
Collaborator


help output differs:

Suggested change
Sends a streaming request to the OpenVINO Model Server OpenAI-compatible chat/completions endpoint
and prints generated tokens as they arrive.
optional arguments:
Sends a streaming request to the OpenVINO Model Server OpenAI-compatible
chat/completions endpoint and prints tokens as they arrive.
options:

"index": 0,
"logprobs": null,
"message": {
"content": "The capital of France is Paris.",
Collaborator


content differs in test:

Suggested change
"content": "The capital of France is Paris.",
"content": "The capital of France is Paris. It is not only the largest city in France but also serves as the country's political, cultural, and economic center. Paris is known for its landmarks such as the Eiffel Tower, Notre-Dame Cathedral, and the Louvre Museum, which is the world's largest art museum and a historic monument in Paris.",

Please also move the command output into separate code blocks marked 'text' instead of 'bash'.

}
],
"created": 1743000000,
"model": "ovms-model",
Collaborator


content differs in test:

Suggested change
"model": "ovms-model",
"model": "Phi-3-mini",
